Information theory and measure theory
This article discusses how information theory (a branch of mathematics studying the transmission, processing and storage of information) is related to measure theory (a branch of mathematics concerned with integration and probability).
== Measures in information theory ==
Many of the concepts in information theory have separate definitions and formulas for the continuous and discrete cases. For example, entropy H(X) is usually defined for discrete random variables, whereas for continuous random variables the related concept of differential entropy, written h(X), is used (see Cover and Thomas, 2006, chapter 8). Both concepts are mathematical expectations, but the expectation is defined by an integral in the continuous case and by a sum in the discrete case.
These separate definitions can be more closely related in terms of measure theory. For discrete random variables, probability mass functions can be considered density functions with respect to the counting measure. Thinking of both the integral and the sum as integration on a measure space allows for a unified treatment.
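As a quick numerical illustration of this unified view, here is a minimal sketch in Python (the biased coin and the Uniform(0, 2) density are illustrative choices, not from the article): both quantities have the form -\int f \log f \,d\mu, and only the measure \mu changes.
<syntaxhighlight lang="python">
import numpy as np

# Discrete case: entropy of a biased coin, a sum against the counting measure.
p = np.array([0.25, 0.75])          # probability mass function on a two-point space
H = -np.sum(p * np.log(p))          # H(X) = -sum_x f(x) log f(x)

# Continuous case: differential entropy of Uniform(0, 2), a Riemann sum
# approximating the integral against Lebesgue measure.
dx = 1e-5
x = np.arange(dx / 2, 2, dx)        # midpoints of a fine grid on (0, 2)
f = np.full_like(x, 0.5)            # density f(x) = 1/2 on the support
h = -np.sum(f * np.log(f)) * dx     # h(X) = -integral of f(x) log f(x) dx

print(f"H(coin)   = {H:.4f} nats")  # ~0.5623
print(f"h(U(0,2)) = {h:.4f} nats")  # ~log 2 ~ 0.6931
</syntaxhighlight>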
Consider the formula for the differential entropy of a continuous random variable X with range \mathbb{X} and probability density function f(x):
: h(X) = -\int_{\mathbb{X}} f(x) \log f(x) \,dx.
This can usually be interpreted as the following Riemann–Stieltjes integral:
: h(X) = -\int_{\mathbb{X}} f(x) \log f(x) \,d\mu(x),
where \mu is the Lebesgue measure.
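For example, if X is uniformly distributed on the interval (0, a), then f(x) = 1/a and
: h(X) = -\int_0^a \frac{1}{a} \log \frac{1}{a} \,dx = \log a,
which, unlike discrete entropy, can be negative (whenever a < 1); for a = 2 this gives \log 2, matching the numerical sketch above.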
If instead X is discrete, with range \Omega a finite set, f is a probability mass function on \Omega, and \nu is the counting measure on \Omega, we can write:
: H(X) = -\sum_{x \in \Omega} f(x) \log f(x) = -\int_\Omega f(x) \log f(x) \,d\nu(x).
The integral expression and the general concept are identical to the continuous case; the only difference is the measure used. In both cases the probability density function f is the Radon–Nikodym derivative of the probability measure with respect to the measure against which the integral is taken.
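For example, if \Omega = \{0, 1\} and f(0) = f(1) = \tfrac{1}{2} (a fair coin), integrating against the counting measure simply sums the two point masses:
: H(X) = -\int_\Omega f(x) \log f(x) \,d\nu(x) = -2 \cdot \tfrac{1}{2} \log \tfrac{1}{2} = \log 2.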
If \mathbb P is the probability measure on X, then the integral can also be taken directly with respect to \mathbb P :
: h(X) = -\int_X \log \frac{d\mathbb P}{d\mu} \,d\mathbb P.
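Integrating directly against \mathbb P amounts to averaging -\log f over samples drawn from \mathbb P. A minimal Monte Carlo sketch (the standard normal example and the sample size are illustrative assumptions, not from the article):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# P is a standard normal; its density f is the Radon-Nikodym derivative
# dP/dmu with respect to Lebesgue measure mu.
samples = rng.standard_normal(1_000_000)
log_f = -0.5 * samples**2 - 0.5 * np.log(2 * np.pi)  # log f(x) for N(0, 1)

h_mc = -log_f.mean()                      # -integral of log(dP/dmu) dP, estimated from P-samples
h_exact = 0.5 * np.log(2 * np.pi * np.e)  # closed form: (1/2) log(2*pi*e)

print(f"Monte Carlo: {h_mc:.4f} nats, exact: {h_exact:.4f} nats")
</syntaxhighlight>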
If instead of the underlying measure \mu we take another probability measure \mathbb Q, we are led to the Kullback–Leibler divergence: let \mathbb P and \mathbb Q be probability measures over the same space. Then if \mathbb P is absolutely continuous with respect to \mathbb Q, written \mathbb P \ll \mathbb Q, the Radon–Nikodym derivative \frac{d\mathbb P}{d\mathbb Q} exists and the Kullback–Leibler divergence can be expressed in its full generality:
:D_\mathrm{KL}(\mathbb P \| \mathbb Q) = \int_{\operatorname{supp} \mathbb P} \frac{d\mathbb P}{d\mathbb Q} \log \frac{d\mathbb P}{d\mathbb Q} \,d\mathbb Q = \int_{\operatorname{supp} \mathbb P} \log \frac{d\mathbb P}{d\mathbb Q} \,d\mathbb P,

where the integral runs over the support of \mathbb P. Note that we have dropped the negative sign: the Kullback–Leibler divergence is always non-negative due to Gibbs' inequality.
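The absolute-continuity requirement and the restriction to the support of \mathbb P translate directly into code. A minimal sketch for the discrete case (the helper name kl_divergence and the example vectors are illustrative assumptions, not from the article):
<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability vectors.

    Requires P << Q: wherever q vanishes, p must vanish too; otherwise the
    Radon-Nikodym derivative dP/dQ does not exist and the divergence is +inf.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.any((q == 0) & (p > 0)):
        return np.inf                    # P is not absolutely continuous w.r.t. Q
    support = p > 0                      # the integral runs over supp(P)
    return np.sum(p[support] * np.log(p[support] / q[support]))

print(kl_divergence([0.25, 0.75], [0.5, 0.5]))  # ~0.1308 nats, >= 0 by Gibbs' inequality
print(kl_divergence([0.5, 0.5], [0.25, 0.75]))  # ~0.1438 nats: note the asymmetry
</syntaxhighlight>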

Source: Wikipedia, the free encyclopedia ("Information theory and measure theory", English edition).